Probabilistic Incremental Rule Learning

Author

  • Gregory Weber
Abstract

This paper describes PISCES 1.2E, a system for incremental learning of probabilistic rules. PISCES is efficiently incremental in the sense that both its processing time per instance and its memory usage are independent of the number of training instances. Classification accuracy alone does not provide a sufficient measure of performance for probabilistic classifiers. Additional measures include extrinsic confidence (EC), which is the average degree to which actual events are unsurprising; and intrinsic confidence (IC) and entropy, which measure certainty. EC and IC are also useful as heuristic functions in the search for concept descriptions. PISCES achieves classification accuracy nearly as high as that of a non-incremental rule learning system, and significantly better performance according to the other three measures.
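The abstract does not reproduce PISCES's exact formulas, but the three extra measures can be illustrated with one plausible operationalization: entropy as Shannon entropy of the predicted distribution, intrinsic confidence (IC) as how peaked the prediction is regardless of the outcome, and extrinsic confidence (EC) as the average probability assigned to the class that actually occurred. All definitions below are assumptions for illustration, not PISCES's own.

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a predicted class distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def intrinsic_confidence(dist):
    """Assumed reading of IC: certainty of the prediction itself,
    here taken as the maximum class probability."""
    return max(dist)

def extrinsic_confidence(predictions, actuals):
    """Assumed reading of EC: mean probability the classifier gave to
    the class that actually occurred (actual events are 'unsurprising'
    exactly when this is high)."""
    return sum(dist[y] for dist, y in zip(predictions, actuals)) / len(predictions)

# Two predicted distributions over three classes, with the true labels.
preds = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]
ys = [0, 1]
ec = extrinsic_confidence(preds, ys)  # (0.7 + 0.8) / 2 = 0.75
```

Note how EC needs the true labels while IC and entropy do not; this is what makes EC "extrinsic" and the other two "intrinsic" under this reading.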


Related articles

Probabilistic categorization: how do normal participants and amnesic patients do it?

In probabilistic categorization tasks, various cues are probabilistically (but not perfectly) predictive of class membership. This means that a given combination of cues sometimes belongs to one class and sometimes to another. It is not yet clear how categorizers approach such tasks. Here, we review evidence in favor of two alternative conceptualizations of learning in probabilistic categorizat...


Probabilistic versus Incremental Presynaptic Learning in Biologically Plausible Synapses

In this paper, the presynaptic rule, a classical rule for Hebbian learning, is revisited. It is shown that the presynaptic rule exhibits relevant synaptic properties such as synaptic directionality and LTP metaplasticity (long-term potentiation threshold metaplasticity). With slight modifications, the presynaptic model also exhibits metaplasticity of the long-term depression threshold, being also...


Playing in continuous spaces: some analysis and extension of population-based incremental learning

As an alternative to traditional Evolutionary Algorithms (EAs), Population-Based Incremental Learning (PBIL) maintains a probabilistic model of the best individual(s). Originally, PBIL was applied in binary search spaces. Recently, some work has been done to extend it to continuous spaces. In this paper, we review two such extensions of PBIL. An improved version of the PBIL based on Gaussian mo...
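The binary form of PBIL mentioned above is compact enough to sketch: instead of evolving a population, it maintains one probability vector, samples candidates from it, and nudges each probability toward the best sample. This is a generic illustration of binary PBIL, not the paper's Gaussian extension; the OneMax fitness and all parameter values are arbitrary choices.

```python
import random

def pbil(fitness, n_bits, pop_size=20, lr=0.1, iters=100, seed=0):
    """Minimal binary PBIL: the probabilistic model is one Bernoulli
    probability per bit, updated toward the best sampled individual."""
    rng = random.Random(seed)
    p = [0.5] * n_bits  # start maximally uncertain
    for _ in range(iters):
        # Sample a population from the current probability vector.
        pop = [[1 if rng.random() < pi else 0 for pi in p]
               for _ in range(pop_size)]
        best = max(pop, key=fitness)
        # Shift each probability a fraction lr toward the best bits.
        p = [pi + lr * (b - pi) for pi, b in zip(p, best)]
    return p

# OneMax: fitness is simply the number of 1-bits.
probs = pbil(sum, n_bits=8)
```

The continuous-space extensions the snippet refers to replace the per-bit Bernoulli model with a continuous distribution (e.g. a Gaussian per variable), but the sample-select-update loop stays the same.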


A diversity maintaining population-based incremental learning algorithm

In this paper we propose a new probability update rule and sampling procedure for population-based incremental learning. These proposed methods are based on the concept of opposition as a means for controlling the amount of diversity within a given sample population. We prove that under this scheme we are able to asymptotically guarantee a higher diversity, which allows for a greater exploratio...


A high capacity incremental and local learning algorithm for attractor neural networks

Attractor neural networks such as the Hopfield network can be trained by a number of algorithms. The simplest of these methods is the Hebb rule, which is strictly local (the weight of a synapse depends only on the activation of the two neurons it connects) and incremental (adding a new memory requires knowing only the old weight matrix, not the patterns previously stored). Both of ...
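The locality and incrementality of the Hebb rule are easy to see in a short sketch (an illustrative implementation, not the paper's code): storing a pattern touches only the old weight matrix and the new pattern, and each weight update involves only the two neurons a synapse connects.

```python
import numpy as np

def hebb_store(W, pattern):
    """Incrementally store one bipolar (+1/-1) pattern.
    Local: each weight update uses only the two connected neurons.
    Incremental: only the old W and the new pattern are needed."""
    p = np.asarray(pattern, dtype=float)
    W = W + np.outer(p, p)      # w_ij += p_i * p_j
    np.fill_diagonal(W, 0.0)    # no self-connections
    return W

def recall(W, state, steps=5):
    """Synchronous Hopfield update: iterate s <- sign(W s)."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0  # break ties consistently
    return s

n = 8
W = np.zeros((n, n))
pat = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hebb_store(W, pat)
noisy = pat.copy()
noisy[0] = -noisy[0]  # corrupt one bit
restored = recall(W, noisy)
```

The "high capacity" in the title refers to going beyond this plain Hebb rule, which is well known to store only a small fraction of n patterns reliably.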



Journal:

Volume   Issue

Pages  -

Publication date: 2003